Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization

Abstract

In this paper we analyze several new methods for solving nonconvex optimization problems whose objective function is formed as a sum of two terms: one is nonconvex and smooth, and the other is convex but simple, with known structure. We consider both unconstrained and linearly constrained nonconvex problems. For optimization problems with this structure, we propose random coordinate descent algorithms and analyze their convergence properties. In the general case, when the objective function is nonconvex and composite, we prove asymptotic convergence of the sequences generated by our algorithms to stationary points, as well as a sublinear rate of convergence in expectation for an optimality measure. Additionally, if the objective function satisfies an error bound condition, we derive a local linear rate of convergence for the expected values of the objective function. We also present extensive numerical experiments evaluating the performance of our algorithms in comparison with state-of-the-art methods.
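
To make the problem class concrete, here is a minimal sketch of one random coordinate proximal-gradient update of the kind described above. It is an illustration under stated assumptions, not the paper's algorithm: the smooth nonconvex term is taken to be a robust Cauchy loss, the simple convex term is an ℓ1 penalty (whose prox is coordinatewise soft thresholding), and the function names, data, and step sizes are all chosen for the example.

```python
import numpy as np

# Hypothetical sketch (not the authors' exact method) of random coordinate
# proximal-gradient descent for F(x) = f(x) + h(x), with f smooth but
# nonconvex (Cauchy loss, chosen for illustration) and h convex and simple
# (h = lam*||x||_1, handled in closed form via soft thresholding).

def soft_threshold(v, t):
    """Scalar prox of t*|.| -- the 'simple' convex term."""
    return np.sign(v) * max(abs(v) - t, 0.0)

def objective(A, y, x, lam):
    r = A @ x - y
    return np.mean(np.log1p(r ** 2)) + lam * np.abs(x).sum()

def random_coordinate_descent(A, y, lam, n_iters=20000, seed=0):
    """Minimize (1/m) * sum_j log(1 + (a_j^T x - y_j)^2) + lam*||x||_1."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - y                                  # residuals, kept up to date
    # Coordinatewise Lipschitz bounds for grad f: |d^2/dr^2 log(1+r^2)| <= 2,
    # so |d^2 f / d x_i^2| <= (2/m) * ||A[:, i]||^2.
    L = 2.0 * np.sum(A ** 2, axis=0) / m + 1e-12
    for _ in range(n_iters):
        i = rng.integers(n)                        # coordinate sampled uniformly at random
        g_i = np.mean(2.0 * r / (1.0 + r ** 2) * A[:, i])  # i-th partial derivative of f
        x_new = soft_threshold(x[i] - g_i / L[i], lam / L[i])
        r += A[:, i] * (x_new - x[i])              # cheap residual update
        x[i] = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    y = A @ x_true + 0.1 * rng.standard_normal(200)
    x = random_coordinate_descent(A, y, lam=0.01)
    print("objective at result:", objective(A, y, x, lam=0.01))
```

Keeping a running residual makes each coordinate update cost O(m) rather than a full gradient evaluation, which is the kind of cheap per-iteration work that motivates coordinate methods at large scale.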

Bibliographic Details

  • Author(s)

    Patrascu, A.; Necoara, I.

  • Affiliation
  • Year 2014
  • Page count
  • Original format PDF
  • Language English
  • CLC classification
